Correlation structure of stochastic neural networks with generic connectivity matrices
Abstract
Using a perturbative expansion for weak synaptic weights and weak sources of randomness, we calculate the correlation structure of neural networks with generic connectivity matrices. Specifically, the perturbative parameters are the mean and the standard deviation of the synaptic weights, together with the standard deviations of the background noise of the membrane potentials and of their initial conditions. We also show how to determine the correlation structure of the system when the synaptic connections have a random topology. The analysis is performed on rate neurons described by the Wilson-Cowan equations, since these allow analytic results to be obtained. Moreover, the perturbative expansion can be developed to any order and for a generic connectivity matrix. Finally, we show an example application of this technique to a particular, biologically relevant topology of the synaptic connections.
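To give a concrete picture of the quantities the abstract refers to, the following Python sketch simulates a noisy rate network of the Wilson-Cowan type with a generic random connectivity matrix and estimates its correlation matrix empirically across trials. It is only an illustration under our own assumptions: the gain function, integration scheme, and all parameter values (mu, sigma_J, sigma_B, sigma_I, dt, T) are chosen for the example and are not taken from the paper, whose analytic perturbative results are not reproduced here.

```python
# Minimal sketch (not the authors' code): simulate a stochastic rate network of
# Wilson-Cowan type and estimate its correlation matrix over trials.
import numpy as np

rng = np.random.default_rng(0)

N = 50              # number of rate neurons
dt, T = 1e-3, 1.0   # integration step and simulated time
steps = int(T / dt)
trials = 200        # independent noise / initial-condition realizations

# "Generic" connectivity: weights with small mean and standard deviation,
# the perturbative parameters named in the abstract (illustrative values).
mu, sigma_J = 0.1, 0.1
J = mu / N + sigma_J / np.sqrt(N) * rng.standard_normal((N, N))

sigma_B = 0.05      # std of the background (white) noise
sigma_I = 0.05      # std of the random initial membrane potentials

def S(v):
    """Sigmoidal gain function of the rate units (assumed form)."""
    return np.tanh(v)

# Membrane potentials at the final time, collected over trials.
V_end = np.empty((trials, N))
for k in range(trials):
    V = sigma_I * rng.standard_normal(N)            # random initial condition
    for _ in range(steps):
        noise = sigma_B * np.sqrt(dt) * rng.standard_normal(N)
        V = V + dt * (-V + J @ S(V)) + noise        # Euler-Maruyama step
    V_end[k] = V

# Empirical covariance matrix across trials; the paper derives its analytic
# counterpart perturbatively in mu, sigma_J, sigma_B and sigma_I.
C = np.cov(V_end, rowvar=False)
print(C.shape, C[0, :3])
```

In the weak-weight, weak-noise regime set up above, such an empirical estimate is the kind of object the perturbative expansion is meant to predict analytically for an arbitrary connectivity matrix J.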
Similar papers
Finite size effects in the correlation structure of stochastic neural networks: analysis of different connectivity matrices and failure of the mean-field theory
We quantify the finite size effects in a stochastic network made up of rate neurons, for several kinds of recurrent connectivity matrices. This analysis is performed by means of a perturbative expansion of the neural equations, where the perturbative parameters are the intensities of the sources of randomness in the system. In detail, these parameters are the variances of the background or inpu...
Robust stability of stochastic fuzzy impulsive recurrent neural networks with time-varying delays
In this paper, the global robust stability of stochastic impulsive recurrent neural networks with time-varying delays, represented by Takagi-Sugeno (T-S) fuzzy models, is considered. A novel Linear Matrix Inequality (LMI)-based stability criterion is obtained by using Lyapunov functional theory to guarantee the asymptotic stability of uncertain fuzzy stochastic impulsive recurrent neural...
A new virtual leader-following consensus protocol to internal and string stability analysis of longitudinal platoon of vehicles with generic network topology under communication and parasitic delays
In this paper, a new virtual leader-following consensus protocol is introduced to perform the internal and string stability analysis of a longitudinal platoon of vehicles under a generic network topology. In all previous studies on multi-agent systems with generic network topology, the control parameters are strictly dependent on the eigenvalues of the network matrices (adjacency or Laplacian). Since some ...
On the correlation dimension of recurrent neural networks
Recurrent sigmoidal neural networks with asymmetric weight matrices and recurrent neural networks with nonmonotone transfer functions can exhibit ongoing fluctuations rather than settling into point attractors. It is, however, an open question whether these fluctuations are the sign of low-dimensional chaos or whether they can be considered as close to stochastic. We report on the calculation of the correlat...
Initialization and self-organized optimization of recurrent neural network connectivity.
Reservoir computing (RC) is a recent paradigm in the field of recurrent neural networks. Networks in RC have a sparsely and randomly connected fixed hidden layer, and only output connections are trained. RC networks have recently received increased attention as a mathematical model for generic neural microcircuits to investigate and explain computations in neocortical columns. Applied to specif...